Upper Bounds on the Probability of Error in terms of Mean Divergence Measures

Author

  • Inder Jeet Taneja
Abstract

In this paper we consider some well-known means such as the arithmetic, harmonic, geometric, and root-square means. Considering the differences of these means, we can establish some inequalities among them [5, 6]. Interestingly, the mean differences considered are convex functions. Applying some of their properties, upper bounds on the probability of error are established in this paper. It is also shown that the results obtained are sharper than those obtained by directly applying known inequalities.
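As a brief illustration (the notation below is assumed for exposition, not quoted from the paper), for positive numbers a and b the means in question and their classical ordering are

\[ H(a,b)=\frac{2ab}{a+b},\qquad G(a,b)=\sqrt{ab},\qquad A(a,b)=\frac{a+b}{2},\qquad S(a,b)=\sqrt{\frac{a^{2}+b^{2}}{2}}, \]
\[ H(a,b)\le G(a,b)\le A(a,b)\le S(a,b). \]

A difference such as A(a,b) - G(a,b) is nonnegative and jointly convex in (a,b), since A is linear and G is concave; summing it over the coordinates of two probability distributions P and Q gives a nonnegative divergence measure, for example

\[ M_{AG}(P\|Q)=\sum_{i}\left(\frac{p_{i}+q_{i}}{2}-\sqrt{p_{i}q_{i}}\right)=\frac{1}{2}\sum_{i}\left(\sqrt{p_{i}}-\sqrt{q_{i}}\right)^{2}, \]

i.e. one half of the squared Hellinger distance (the symbol M_{AG} is illustrative only).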


Similar resources

Generalized Symmetric Divergence Measures and the Probability of Error

There are three classical divergence measures in the literature on information theory and statistics: the Jeffreys-Kullback-Leibler [6], [8] J-divergence, the Sibson-Burbea-Rao [9], [3] Jensen-Shannon divergence, and the Taneja [11] arithmetic-geometric divergence. These three measures bear an interesting relationship among each other. Divergence measures like the Hellinger [5...
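For reference, these three measures are commonly written as follows for discrete distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n); the forms below are the standard ones, given here as background rather than quoted from the cited abstract:

\[ J(P\|Q)=\sum_{i}(p_{i}-q_{i})\ln\frac{p_{i}}{q_{i}}, \]
\[ I(P\|Q)=\frac{1}{2}\left[\sum_{i}p_{i}\ln\frac{2p_{i}}{p_{i}+q_{i}}+\sum_{i}q_{i}\ln\frac{2q_{i}}{p_{i}+q_{i}}\right], \]
\[ T(P\|Q)=\sum_{i}\frac{p_{i}+q_{i}}{2}\ln\frac{p_{i}+q_{i}}{2\sqrt{p_{i}q_{i}}}, \]

where J is the J-divergence, I the Jensen-Shannon divergence, and T the arithmetic-geometric divergence.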


Exact maximum coverage probabilities of confidence intervals with increasing bounds for Poisson distribution mean

The Poisson distribution is widely used as a standard model for analyzing count data, so estimation of the Poisson mean is common in practice. Providing accurate confidence intervals for the parameters of discrete distributions is difficult. So far, many asymptotic confidence intervals for the mean of the Poisson distribution have been proposed. It is known that the coverag...

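As a sketch of the quantity under study (the notation is assumed here), for a two-sided interval [L(x), U(x)] built from an observation x of a Poisson(λ) count, the exact coverage probability at a fixed λ is

\[ C(\lambda)=\sum_{x=0}^{\infty}\mathbf{1}\{L(x)\le\lambda\le U(x)\}\,\frac{e^{-\lambda}\lambda^{x}}{x!}, \]

and because x is discrete, C(λ) oscillates in λ, which is why asymptotic intervals can fall below the nominal level for some parameter values.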

Arimoto-Rényi conditional entropy and Bayesian hypothesis testing

This paper gives upper and lower bounds on the minimum error probability of Bayesian M-ary hypothesis testing in terms of the Arimoto-Rényi conditional entropy of an arbitrary order α. The improved tightness of these bounds over their specialized versions with the Shannon conditional entropy (α = 1) is demonstrated. In particular, in the case where M is finite, we show how to generalize Fano’s...
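For context, the classical Fano inequality that such bounds generalize can be stated as (standard form, corresponding to the Shannon case α = 1):

\[ H(X\mid Y)\le h_{b}(\varepsilon_{X\mid Y})+\varepsilon_{X\mid Y}\log(M-1), \]

where ε_{X|Y} is the minimum probability of error in guessing X from Y, M is the number of hypotheses, and h_b is the binary entropy function; the cited work replaces the Shannon conditional entropy H(X|Y) by the Arimoto-Rényi conditional entropy of order α.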


Generalization of Information Measures

General formulas for entropy, mutual information, and divergence are established. It is revealed that these quantities are actually determined by three decisive sequences of random variables, which are, respectively, the normalized source information density, the normalized channel information density, and the normalized log-likelihood ratio. In terms of the ultimate cumulative distribution f...
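As an illustration of one of these sequences (a standard definition from the information-spectrum literature, given as background rather than quoted from the abstract), the normalized log-likelihood ratio between two sequences of distributions P_n and Q_n is

\[ \frac{1}{n}\log\frac{\mathrm{d}P_{n}}{\mathrm{d}Q_{n}}(X^{n}),\qquad X^{n}\sim P_{n}, \]

and its limiting cumulative distribution is what determines the general divergence formula in this framework.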


Image Registration and Segmentation by Maximizing the Jensen-Rényi Divergence

Information theoretic measures provide quantitative entropic divergences between two probability distributions or data sets. In this paper, we analyze the theoretical properties of the Jensen-Rényi divergence, which is defined between an arbitrary number of probability distributions. Using the theory of majorization, we derive its maximum value, and also some performance upper bounds in terms o...
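For completeness, the Jensen-Rényi divergence of order α with weight vector ω = (ω_1, ..., ω_n), ω_i ≥ 0, Σ_i ω_i = 1, is commonly defined as (standard form, assumed here):

\[ JR_{\alpha}^{\omega}(P_{1},\dots,P_{n})=R_{\alpha}\!\left(\sum_{i=1}^{n}\omega_{i}P_{i}\right)-\sum_{i=1}^{n}\omega_{i}R_{\alpha}(P_{i}),\qquad R_{\alpha}(P)=\frac{1}{1-\alpha}\log\sum_{j}p_{j}^{\alpha}, \]

which is nonnegative for 0 < α < 1 because the Rényi entropy R_α is then concave, and which recovers the (multi-distribution) Jensen-Shannon divergence as α → 1.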




Journal:
  • CoRR

Volume: abs/1103.5219  Issue: -

Pages: -

Publication date: 2011